Risk Bounds in Isotonic Regression

Author

  • CUN-HUI ZHANG
Abstract

Nonasymptotic risk bounds are provided for maximum likelihood-type isotonic estimators of an unknown nondecreasing regression function, with general average loss at design points. These bounds are optimal up to scale constants, and they imply uniform n^{-1/3}-consistency of the ℓ_p risk for unknown regression functions of uniformly bounded variation, under mild assumptions on the joint probability distribution of the data, with possibly dependent observations.
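For concreteness (this illustration is not from the paper): with squared-error loss, the maximum likelihood-type isotonic estimator at the design points reduces to the isotonic least-squares fit, which the pool-adjacent-violators algorithm (PAVA) computes exactly. A minimal NumPy sketch, assuming equally weighted design points:

```python
import numpy as np

def pava(y, w=None):
    """Pool-adjacent-violators: nondecreasing least-squares fit to y.

    Returns the vector m minimizing sum_i w_i * (y_i - m_i)^2 subject to
    m_1 <= ... <= m_n, by merging adjacent blocks that violate monotonicity
    and replacing each block with its weighted mean.
    """
    y = np.asarray(y, dtype=float)
    w = np.ones(y.size) if w is None else np.asarray(w, dtype=float)

    means, weights, sizes = [], [], []   # one entry per merged block
    for yi, wi in zip(y, w):
        means.append(yi); weights.append(wi); sizes.append(1)
        # Merge backwards while the last two block means are out of order.
        while len(means) > 1 and means[-2] > means[-1]:
            m2, w2, s2 = means.pop(), weights.pop(), sizes.pop()
            m1, w1, s1 = means.pop(), weights.pop(), sizes.pop()
            wt = w1 + w2
            means.append((w1 * m1 + w2 * m2) / wt)
            weights.append(wt)
            sizes.append(s1 + s2)

    # Expand block means back to a fitted value per design point.
    return np.repeat(means, sizes)

# Example: noisy observations of a nondecreasing function.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = np.sqrt(x) + 0.2 * rng.normal(size=x.size)
fit = pava(y)   # nondecreasing fit at the design points
```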


Related references

Adaptive Risk Bounds in Unimodal Regression

We study the statistical properties of the least squares estimator in unimodal sequence estimation. Although closely related to isotonic regression, unimodal regression has not been as extensively studied. We show that the unimodal least squares estimator is adaptive in the sense that the risk scales as a function of the number of values in the true underlying sequence. Such adaptivity properti...
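As a hedged illustration (the split-scan below is a standard way to compute the estimator, not necessarily the procedure used in this paper): the unimodal least-squares fit can be obtained by scanning the split point between the nondecreasing and nonincreasing pieces, fitting an isotonic regression on each side, and keeping the best combination. A sketch using scikit-learn's IsotonicRegression:

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

def unimodal_lse(y):
    """Unimodal least-squares fit by scanning the split point.

    For each split m, fit a nondecreasing isotonic regression on y[:m] and a
    nonincreasing one on y[m:].  Every such concatenation is unimodal, and the
    union over m covers all unimodal sequences, so the best split yields the
    unimodal least-squares estimator.
    """
    y = np.asarray(y, dtype=float)
    n = y.size
    up = IsotonicRegression(increasing=True)
    down = IsotonicRegression(increasing=False)
    best_fit, best_err = None, np.inf
    for m in range(n + 1):
        left = up.fit_transform(np.arange(m), y[:m]) if m > 0 else np.empty(0)
        right = down.fit_transform(np.arange(n - m), y[m:]) if m < n else np.empty(0)
        fit = np.concatenate([left, right])
        err = float(np.sum((y - fit) ** 2))
        if err < best_err:
            best_fit, best_err = fit, err
    return best_fit
```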


Improved Risk Bounds in Isotonic Regression

We consider the problem of estimating an unknown non-decreasing sequence θ from finitely many noisy observations. We give an improved global risk upper bound for the isotonic least squares estimator (LSE) in this problem. The obtained risk bound behaves differently depending on the form of the true sequence θ – one gets a whole range of rates, from log n / n (when θ is constant) to n^{-2/3} (when θ i...


Application of Isotonic Regression in Predicting Business Risk Scores

An isotonic regression model fits an isotonic function of the explanatory variables to estimate the expectation of the response variable. In other words, as the explanatory variables increase, the estimated expectation of the response must be non-decreasing. With this characteristic, isotonic regression could be a suitable option to analyze and predict business risk scores. A current challenge of isotonic ...


Sharp oracle bounds for monotone and convex regression through aggregation

We derive oracle inequalities for the problems of isotonic and convex regression using a combination of the Q-aggregation procedure and sparsity pattern aggregation. This improves upon previous results, including the oracle inequalities for the constrained least squares estimator. One of the improvements is that our oracle inequalities are sharp, i.e., with leading constant 1. It allows us to ...


Online Isotonic Regression

We consider the online version of the isotonic regression problem. Given a set of linearly ordered points (e.g., on the real line), the learner must predict labels sequentially at adversarially chosen positions and is evaluated by her total squared loss compared against the best isotonic (nondecreasing) function in hindsight. We survey several standard online learning algorithms and show that n...
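As a toy illustration of the protocol (the naive learner below is not one of the algorithms surveyed in the paper): positions are revealed one at a time, the learner predicts before seeing each label, and regret is measured against the best isotonic function in hindsight, i.e., the offline isotonic least-squares fit.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(0)
n = 200
x = np.sort(rng.uniform(0, 1, n))                 # linearly ordered positions
y = np.clip(x + 0.3 * rng.normal(size=n), 0, 1)   # noisy labels in [0, 1]

# Online protocol: positions arrive in (here random) order; the learner must
# predict before the label is revealed.  Naive baseline: predict the midpoint
# between the largest label seen at smaller positions and the smallest label
# seen at larger positions (defaulting to the endpoints 0 and 1).
order = rng.permutation(n)
seen_x, seen_y = [], []
online_loss = 0.0
for t in order:
    below = [yy for xx, yy in zip(seen_x, seen_y) if xx <= x[t]]
    above = [yy for xx, yy in zip(seen_x, seen_y) if xx > x[t]]
    lo = max(below) if below else 0.0
    hi = min(above) if above else 1.0
    pred = 0.5 * (lo + hi)
    online_loss += (pred - y[t]) ** 2
    seen_x.append(x[t]); seen_y.append(y[t])

# Best isotonic (nondecreasing) function in hindsight: offline isotonic fit.
hindsight = IsotonicRegression(increasing=True).fit_transform(x, y)
hindsight_loss = float(np.sum((hindsight - y) ** 2))
print(f"regret = {online_loss - hindsight_loss:.3f}")
```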



Publication date: 2002